Augmentation schemes for particle MCMC

Authors

  • Paul Fearnhead
  • Loukia Meligkotsidou
Abstract

Particle MCMC involves using a particle filter within an MCMC algorithm. For inference of a model which involves an unobserved stochastic process, the standard implementation uses the particle filter to propose new values for the stochastic process, and MCMC moves to propose new values for the parameters. We show how particle MCMC can be generalised beyond this. Our key idea is to introduce new latent variables. We then use the MCMC moves to update the latent variables, and the particle filter to propose new values for the parameters and stochastic process given the latent variables. A generic way of defining these latent variables is to model them as pseudo-observations of the parameters or of the stochastic process. By choosing the amount of information these latent variables have about the parameters and the stochastic process we can often improve the mixing of the particle MCMC algorithm by trading off the Monte Carlo error of the particle filter and the mixing of the MCMC moves. We show that using pseudo-observations within particle MCMC can improve its efficiency in certain scenarios: dealing with initialisation problems of the particle filter; speeding up the mixing of particle Gibbs when there is strong dependence between the parameters and the stochastic process; and enabling further MCMC steps to be used within the particle filter.
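As a concrete reference point, the following is a minimal sketch of the standard particle MCMC implementation that the abstract generalises: particle marginal Metropolis-Hastings, where a bootstrap particle filter supplies an unbiased likelihood estimate inside a random-walk Metropolis update of the parameter. The linear-Gaussian model, particle count, and proposal scale are illustrative assumptions for this example, not taken from the paper.

```python
import numpy as np

# Illustrative state-space model (an assumption for this sketch):
#   x_t = phi * x_{t-1} + N(0, 1),   y_t = x_t + N(0, 1),
# with a single unknown parameter phi.

rng = np.random.default_rng(0)

def simulate(phi, T=50):
    """Simulate observations from the model."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal()
    return x + rng.normal(size=T)

def log_lik_pf(phi, y, N=200):
    """Bootstrap particle filter estimate of log p(y | phi)."""
    x = rng.normal(size=N)                     # initial particles
    ll = 0.0
    for t in range(len(y)):
        if t > 0:
            x = phi * x + rng.normal(size=N)   # propagate
        logw = -0.5 * (y[t] - x) ** 2          # Gaussian obs density (up to a constant)
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())             # running log-likelihood estimate
        x = x[rng.choice(N, size=N, p=w / w.sum())]  # multinomial resampling
    return ll

def pmmh(y, n_iter=200, step=0.1):
    """Random-walk Metropolis on phi using the particle-filter likelihood."""
    phi = 0.5
    ll = log_lik_pf(phi, y)
    chain = []
    for _ in range(n_iter):
        phi_new = phi + step * rng.normal()
        ll_new = log_lik_pf(phi_new, y)
        # Flat prior on (-1, 1); proposals outside are rejected.
        if abs(phi_new) < 1 and np.log(rng.uniform()) < ll_new - ll:
            phi, ll = phi_new, ll_new
        chain.append(phi)
    return np.array(chain)

y = simulate(0.8)
chain = pmmh(y)
print(chain.mean())
```

The paper's augmentation schemes modify this baseline by conditioning the particle filter on pseudo-observations of `phi` (or of the state path), so that the filter proposes the parameter as well and the MCMC moves only update the pseudo-observations.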


Similar articles

Adaptive Metropolis-Hastings samplers for the Bayesian analysis of large linear Gaussian systems

This paper considers the implementation of efficient Bayesian computation for large linear Gaussian models containing many latent variables. A common approach is to implement a simple MCMC procedure such as the Gibbs sampler or data augmentation, but these methods are often unsatisfactory when the model is large. This motivates the need to develop other strategies for improving MCMC. This paper...


Bayesian Inference for Irreducible Diffusion Processes Using the Pseudo-Marginal Approach

In this article we examine two relatively new MCMC methods which allow for Bayesian inference in diffusion models. First, the Monte Carlo within Metropolis (MCWM) algorithm (O’Neill, Balding, Becker, Serola and Mollison, 2000) uses an importance sampling approximation for the likelihood and yields a limiting stationary distribution that can be made arbitrarily “close” to the posterior distributi...


Group Importance Sampling for Particle Filtering and MCMC

Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignation of a single weighted sample which compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implic...
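A toy sketch of the compression idea described in this teaser: a population of weighted importance samples is summarised by a single "group" sample whose weight is the sum of the group's weights and whose value is resampled from the group. The target, proposal, and group sizes below are illustrative assumptions, not taken from the GIS paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def target_logpdf(x):
    return -0.5 * x ** 2                    # standard normal, up to a constant

def proposal_sample(n):
    return rng.uniform(-4, 4, size=n)       # uniform proposal on [-4, 4]

def group_compress(xs, ws):
    """Compress weighted samples (xs, ws) into one weighted sample."""
    W = ws.sum()                            # group weight = sum of member weights
    x = xs[rng.choice(len(xs), p=ws / W)]   # resample one representative
    return x, W

# Self-normalised IS estimate of E[x^2] under N(0, 1) from the full population...
n, groups = 500, 10
xs = proposal_sample(n)
ws = np.exp(target_logpdf(xs)) / (1 / 8)    # unnormalised weights pi(x) / q(x)
full = np.sum(ws * xs**2) / ws.sum()

# ...versus the same estimate built from 10 compressed group samples.
gx, gw = zip(*(group_compress(xs[i::groups], ws[i::groups]) for i in range(groups)))
gx, gw = np.array(gx), np.array(gw)
grouped = np.sum(gw * gx**2) / gw.sum()
print(full, grouped)
```

The grouped estimate is noisier than the full one, but retaining the summed weight is what keeps the compressed samples properly calibrated against each other.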


Particle Markov Chain Monte Carlo for Multiple Change-point Problems

Multiple change-point models are a popular class of time series models which allow the description of temporal heterogeneity in data. We develop efficient Markov Chain Monte Carlo (MCMC) algorithms to perform Bayesian inference in this context. Our so-called Particle MCMC (PMCMC) algorithms rely on an efficient Sequential Monte Carlo (SMC) technique for change-point models, developed in [13], t...


On Multiple Try Schemes and the Particle Metropolis-hastings Algorithm

Markov Chain Monte Carlo (MCMC) methods are well-known Monte Carlo methodologies, widely used in different fields for statistical inference and stochastic optimization. The Multiple Try Metropolis (MTM) algorithm is an extension of the standard Metropolis-Hastings (MH) algorithm in which the next state of the chain is chosen among a set of candidates, according to certain weights. The Particle ...
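A minimal sketch of the Multiple Try Metropolis step this teaser refers to, in the simplest setting: a symmetric random-walk proposal with importance weights w(x) = pi(x), for which the acceptance probability reduces to the ratio of summed candidate weights to summed reference-set weights. The target and tuning constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def pi(x):
    return np.exp(-0.5 * x ** 2)                # unnormalised N(0, 1) target

def mtm_step(x, k=5, step=2.0):
    """One MTM update with k candidates and weights w(x) = pi(x)."""
    ys = x + step * rng.normal(size=k)          # draw k candidates
    wy = pi(ys)
    y = ys[rng.choice(k, p=wy / wy.sum())]      # select one in proportion to weight
    xs = y + step * rng.normal(size=k - 1)      # reference set drawn around y
    wx = pi(xs).sum() + pi(x)                   # reference weights include current state
    if rng.uniform() < wy.sum() / wx:           # generalised acceptance ratio
        return y
    return x

x, chain = 0.0, []
for _ in range(2000):
    x = mtm_step(x)
    chain.append(x)
chain = np.array(chain)
print(chain.mean(), chain.var())
```

Drawing several candidates per iteration lets MTM use a larger step size than plain Metropolis-Hastings while keeping acceptance rates reasonable, which is the property the particle Metropolis-Hastings connection exploits.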



Journal:
  • Statistics and Computing

Volume 26, Issue 

Pages  -

Published 2016